A timeline of events related to information theory, quantum information theory and statistical physics, data compression, error correcting codes and related subjects.

* 1872 – Ludwig Boltzmann presents his H-theorem, and with it the formula Σ ''p''<sub>i</sub> log ''p''<sub>i</sub> for the entropy of a single gas particle.
* 1878 – J. Willard Gibbs defines the Gibbs entropy: the probabilities in the entropy formula are now taken as probabilities of the state of the ''whole'' system.
* 1924 – Harry Nyquist discusses quantifying "intelligence" and the speed at which it can be transmitted by a communication system.
* 1927 – John von Neumann defines the von Neumann entropy, extending the Gibbs entropy to quantum mechanics.
* 1928 – Ralph Hartley introduces Hartley information as the logarithm of the number of possible messages, with information being communicated when the receiver can distinguish one sequence of symbols from any other (regardless of any associated meaning).
* 1929 – Leó Szilárd analyses Maxwell's demon, showing how a Szilard engine can sometimes transform information into the extraction of useful work.
* 1940 – Alan Turing introduces the deciban as a measure of information inferred about the German Enigma machine cypher settings by the Banburismus process.
* 1944 – Claude Shannon's theory of information is substantially complete.
* 1947 – Richard W. Hamming invents Hamming codes for error detection and correction. For patent reasons, the result is not published until 1950.
* 1948 – Claude E. Shannon publishes ''A Mathematical Theory of Communication''.
* 1949 – Claude E. Shannon publishes ''Communication in the Presence of Noise'' – Nyquist–Shannon sampling theorem and Shannon–Hartley law.
* 1949 – Claude E. Shannon's ''Communication Theory of Secrecy Systems'' is declassified.
* 1949 – Robert M. Fano publishes ''Transmission of Information'' (M.I.T. Press, Cambridge, Mass.) – Shannon–Fano coding.
* 1949 – Leon G. Kraft discovers Kraft's inequality, which shows the limits of prefix codes.
* 1949 – Marcel J. E. Golay introduces Golay codes for forward error correction.
* 1951 – Solomon Kullback and Richard Leibler introduce the Kullback–Leibler divergence.
* 1951 – David A. Huffman invents Huffman encoding, a method of finding optimal prefix codes for lossless data compression.
* 1953 – August Albert Sardinas and George W. Patterson devise the Sardinas–Patterson algorithm, a procedure to decide whether a given variable-length code is uniquely decodable.
* 1954 – Irving S. Reed and David E. Muller propose Reed–Muller codes.
* 1955 – Peter Elias introduces convolutional codes.
* 1957 – Eugene Prange first discusses cyclic codes.
* 1959 – Alexis Hocquenghem, and independently the next year Raj Chandra Bose and Dwijendra Kumar Ray-Chaudhuri, discover BCH codes.
* 1960 – Irving S. Reed and Gustave Solomon propose Reed–Solomon codes.
* 1962 – Robert G. Gallager proposes low-density parity-check codes; they go unused for 30 years due to the technical limitations of the era.
* 1965 – Dave Forney discusses concatenated codes.
* 1967 – Andrew Viterbi reveals the Viterbi algorithm, making decoding of convolutional codes practicable.
* 1968 – Elwyn Berlekamp invents the Berlekamp–Massey algorithm; its application to decoding BCH and Reed–Solomon codes is pointed out by James L. Massey the following year.
* 1968 – Chris Wallace and David M. Boulton publish the first of many papers on minimum message length (MML) statistical and inductive inference.
* 1970 – Valerii Denisovich Goppa introduces Goppa codes.
* 1972 – J. Justesen proposes Justesen codes, an improvement of Reed–Solomon codes.
* 1973 – David Slepian and Jack Wolf discover and prove the Slepian–Wolf coding limits for distributed source coding.
* 1976 – Gottfried Ungerboeck gives the first paper on trellis modulation; a more detailed exposition in 1982 leads to a raising of analogue modem POTS speeds from 9.6 kbit/s to 33.6 kbit/s.
* 1976 – R. Pasco and Jorma J. Rissanen develop effective arithmetic coding techniques.
* 1977 – Abraham Lempel and Jacob Ziv develop Lempel–Ziv compression (LZ77).
* 1989 – Phil Katz publishes the .zip format, including DEFLATE (LZ77 + Huffman coding); it later becomes the most widely used archive container and most widely used lossless compression algorithm.
* 1993 – Claude Berrou, Alain Glavieux and Punya Thitimajshima introduce turbo codes.
* 1994 – Michael Burrows and David Wheeler publish the Burrows–Wheeler transform, later to find use in bzip2.
* 1995 – Benjamin Schumacher coins the term qubit and proves the quantum noiseless coding theorem.
* 2001 – Sam Kwong and Yu Fan Ho propose statistical Lempel–Ziv.
* 2008 – Erdal Arıkan introduces polar codes, the first practical code construction that achieves capacity for a wide array of channels.
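Three of the milestones above fit together in a few lines of code: Shannon's 1948 entropy gives the compression limit, Huffman's 1951 algorithm builds optimal prefix codes, and the code lengths it produces satisfy Kraft's 1949 inequality. The sketch below illustrates this relationship for a small source; the function names are illustrative, not taken from any of the cited works.

```python
import heapq
import math

def shannon_entropy(probs):
    """H = -sum p_i * log2(p_i), in bits per symbol (Shannon, 1948)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def huffman_code_lengths(probs):
    """Optimal prefix-code lengths via Huffman's greedy merging (1951)."""
    # Heap items: (probability, unique tiebreak id, symbol indices in subtree).
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)   # two least-probable subtrees
        p2, i2, s2 = heapq.heappop(heap)
        for s in s1 + s2:                 # merging adds one bit to each symbol below
            lengths[s] += 1
        heapq.heappush(heap, (p1 + p2, i2, s1 + s2))
    return lengths

probs = [0.5, 0.25, 0.125, 0.125]         # a dyadic source (all powers of 1/2)
H = shannon_entropy(probs)
lens = huffman_code_lengths(probs)
avg_len = sum(p * l for p, l in zip(probs, lens))
kraft_sum = sum(2.0 ** -l for l in lens)  # Kraft: <= 1 for any prefix code
print(H, avg_len, kraft_sum)
```

For a dyadic source like this one, the Huffman code's average length meets the entropy bound exactly (both 1.75 bits/symbol), and the Kraft sum is exactly 1, i.e. the code is complete.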
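The 1947 entry on Hamming codes can likewise be made concrete. A minimal sketch of the classic Hamming (7,4) code follows, using the standard convention of parity bits at positions 1, 2 and 4; the function names are illustrative. Each parity bit checks the positions whose binary index contains that bit, so the three parity checks together form a syndrome that directly names the position of any single flipped bit.

```python
def hamming74_encode(d):
    """Encode 4 data bits as 7 bits; any single-bit error is correctable."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4          # covers positions 3, 5, 7
    p2 = d1 ^ d3 ^ d4          # covers positions 3, 6, 7
    p3 = d2 ^ d3 ^ d4          # covers positions 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]   # positions 1..7, parity at 1, 2, 4

def hamming74_correct(c):
    """Correct up to one flipped bit, then return the 4 data bits."""
    c = list(c)
    syndrome = 0
    for parity_bit in (1, 2, 4):
        # Re-check parity over every position whose index has this bit set.
        if sum(c[i - 1] for i in range(1, 8) if i & parity_bit) % 2:
            syndrome += parity_bit
    if syndrome:               # nonzero syndrome = 1-based error position
        c[syndrome - 1] ^= 1
    return [c[2], c[4], c[5], c[6]]
```

Flipping any one of the seven codeword bits and decoding recovers the original four data bits, which is exactly the single-error-correcting property Hamming published in 1950.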